Patent abstract:
Embodiments of the present description relate generally to the generation and use of three-dimensional terrain maps for vehicular control. Other embodiments may be described and/or claimed.
Publication number: BR112020008778A2
Application number: R112020008778-5
Filing date: 2018-10-31
Publication date: 2020-10-20
Inventors: Tommy Ertboelle Madsen; Tri M. Dang; Joshua M. Gattis; Andreas F. Ramm; Eran D. B. Medagoda; Steven J. Dumble; Bilal Arain
Applicant: AgJunction LLC
IPC main classification:
Patent description:

[0001] This application claims priority to U.S. Provisional Application Serial No. 62/579,515, filed on October 31, 2017, entitled TERRAIN MAPPING, which is incorporated by reference in its entirety.

COPYRIGHT NOTICE
[0002] A portion of the disclosure of this patent document contains material subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the United States Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.

TECHNICAL FIELD
[0003] Embodiments of this disclosure relate generally to the generation and use of three-dimensional terrain maps for vehicular control. Other embodiments may be described and/or claimed.

BACKGROUND
[0004] Vehicle control systems can be used to automatically or semi-automatically move a vehicle along a desired route. Three-dimensional terrain maps are maps that describe the topography of an area of terrain, including natural features (such as rivers, mountains, hills, ravines, etc.) and other objects associated with the terrain (such as vehicles, fences, power lines, etc.). Among other things, embodiments of the present disclosure describe the generation and use of three-dimensional terrain maps in conjunction with vehicle control systems.

BRIEF DESCRIPTION OF THE DRAWINGS
[0005] The included drawings are for illustrative purposes and serve to provide examples of possible structures and operations for the disclosed inventive systems, apparatus, methods, and computer-readable storage media. These drawings in no way limit any changes in form and detail that may be made by one skilled in the art without departing from the spirit and scope of the disclosed implementations.
[0006] Figure 1A is a block diagram of an example of a vehicle control system in accordance with various aspects of the present disclosure.
[0007] Figure 1B is a block diagram illustrating an example of components of a control system according to various aspects of the present disclosure.
[0008] Figure 2 illustrates an example of a vehicle control system coupled to a vehicle.
[0009] Figure 3 is a flow diagram illustrating an example of a process according to various embodiments of the present disclosure.
[0010] Figure 4 is a flow diagram illustrating an example of another process according to various embodiments of the present disclosure.
[0011] Figure 5 is a flow diagram illustrating an example of yet another process according to various embodiments of the present disclosure.
[0012] Figure 6 is a flow diagram illustrating an example of yet another process according to various embodiments of the present disclosure.
[0013] Figures 7-11 are vehicle diagrams that illustrate various embodiments of the present disclosure.

DETAILED DESCRIPTION

I. SYSTEM EXAMPLES
[0014] Figure 1A is a block diagram of a vehicle 50 that includes a vehicle control system 100 for controlling various functions of the vehicle 50, including the steering of the vehicle. The vehicle control system can also be used in conjunction with the generation of a 3D terrain map, as described below (for example, with reference to the method described in Figure 3). In the example shown in Figure 1A, control system 100 includes a camera system that uses one or more sensors, such as cameras 102, to identify features 104 in a field of view 106. In alternative embodiments, the sensors can be positioned in any desired configuration around vehicle 50. For example, in addition to facing forward, cameras 102 can also be positioned on the sides or rear of vehicle 50. The sensors can also be configured to provide 360-degree coverage around the vehicle, such as an omnidirectional camera that captures a 360-degree view image.
[0015] In the example shown in Figure 1A, vehicle control system 100 operates in conjunction with a global navigation satellite system (GNSS) 108 and an inertial measurement unit (IMU)
[0016] The control system 100 can also use data from the sensors (including optical sensors, such as cameras 102) to create a map of an area using a simultaneous localization and mapping (SLAM) process. Terrain features 104 can be represented on the 3D map. The map can be georeferenced (also known as "geolocated") using GNSS 108 data. In some embodiments, the 3D map can be stored online for access and updating by the various vehicles working in an area (for example, agricultural vehicles working in the same field).
[0017] Figure 1B illustrates an example of the components of a B100 control system. In some embodiments, the components of the B100 control system can be used to implement a vehicle control system (such as the systems represented in Figure 1A and Figure 2), a terrain mapping system (for example, to generate a 3D terrain map), or a vehicle implement control system, as discussed in more detail below. Likewise, the B100 control system can be used to implement, or in conjunction with, the methods described in Figures 3-6.
[0018] In this example, the B100 control system includes a B110 processor in communication with a B120 memory, a B130 sensor system, a B140 positioning system, a B150 user interface and a B160 transceiver. The B100 system can include any number of different processors, memory components, sensors, user interface components, and transceiver components, and can interact with any other desired systems and devices in conjunction with embodiments of the present disclosure. Alternative embodiments of the B100 control system may have more or fewer components than those shown in the example in Figure 1B.
[0019] The functionality of the B100 control system, including the steps of the methods described below (in whole or in part), can be implemented through the B110 processor executing computer-readable instructions stored in the B120 memory of the B100 system. The B120 memory can store any computer-readable instructions and data, including software applications and embedded operating code. Portions of the functionality of the methods described here can also be performed by software operating on one or more other computing devices in communication with the B100 control system (for example, via the B160 transceiver).
[0020] The functionality of the B100 system, or of other systems and devices operating in conjunction with embodiments of the present disclosure, can also be implemented through various hardware components that store machine-readable instructions, such as application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs) and/or complex programmable logic devices (CPLDs). Systems according to aspects of certain embodiments can operate in conjunction with any desired combination of software and/or hardware components.
[0021] Any type of B110 processor, such as an integrated circuit microprocessor, a microcontroller, and/or a digital signal processor (DSP), can be used in conjunction with embodiments of the present disclosure. A B120 memory operating in conjunction with embodiments of the disclosure may include any combination of different memory storage devices, such as hard drives, random access memory (RAM), read-only memory (ROM), FLASH memory, or any other type of volatile and/or non-volatile memory. Data can be stored in the B120 memory in any desired manner, such as in a relational database.
[0022] The B130 sensor system can include a variety of different sensors, including sensors for analyzing the terrain around a vehicle, such as an imaging device (for example, a camera or optical sensor), a radar sensor and/or a lidar sensor. The B130 sensor system can also include sensors for determining characteristics relating to a vehicle or the terrain, such as an accelerometer, a gyroscopic sensor and/or a magnetometer.
[0023] The B140 positioning system can include a variety of different components for determining the position of a vehicle. For example, the positioning system may include a global navigation satellite system (GNSS), a local positioning system (LPS) and/or an inertial navigation system (INS).
[0024] The B100 system includes a B150 user interface that can include any number of input devices (not shown) for receiving commands, data and other suitable input. The B150 user interface can also include any number of output devices (not shown) to provide the user with data (such as a visual display of a 3D terrain map and a route to be followed by a vehicle), alerts/notifications, and other information. Typical I/O devices can include display screens, mice, keyboards, printers, scanners, video cameras and other devices.
[0025] The B160 transceiver can include any number of communication devices (such as wireless or wired transceivers, modems, network interfaces, etc.) to allow the B100 system to communicate with one or more computing devices, as well as other systems. The B100 control system can be, include, or operate in conjunction with, a laptop computer, a desktop computer, a mobile subscriber communication device, a mobile phone, a personal digital assistant (PDA), a tablet computer, an electronic book reader, a digital camera, a video camera, a video game console, and/or any other suitable computing device.
[0026] The B160 transceiver can be adapted to communicate using any electronic communication system or method. Communication between components operating in conjunction with embodiments of the present disclosure can be performed using any appropriate communication method, such as, for example, a telephone network, an extranet, an intranet, the Internet, wireless communications, transponder communications, a local area network (LAN), a wide area network (WAN), a virtual private network (VPN), networked or connected devices, and/or any suitable communication format.
[0027] Although some embodiments can be implemented in fully functioning computers and computer systems, various embodiments are capable of being distributed as a computing product in a variety of forms, and of being applied regardless of the particular type of machine or computer-readable media actually used to effect the distribution.
[0028] A tangible, non-transitory computer-readable medium can be used to store software and data that, when executed by a system, cause the system to perform the various operations described here. Executable software and data can be stored on various types of computer-readable media, including, for example, ROM, volatile RAM, non-volatile memory and/or cache. Other examples of computer-readable media include, but are not limited to, recordable and non-recordable media, such as volatile and non-volatile memory devices, read-only memory (ROM), random access memory (RAM), flash memory devices, disk storage media, optical storage media (for example, Compact Disc Read-Only Memory (CD-ROMs), Digital Versatile Discs (DVDs), etc.), among others.
[0029] Figure 2 shows another example of a vehicle control system 210. In this example, vehicle control system 210 includes a GNSS receiver 4 comprising an RF converter (i.e., downconverter) 16, a tracking device 18 and an RTK rover receiver element 20. Receiver 4 communicates electrically with, and provides GNSS positioning data to, guidance processor 6. Guidance processor 6 includes a graphical user interface (GUI) 26, a microprocessor 24 and a media element 22, such as a memory storage unit. The guidance processor 6 communicates electrically with, and provides control data to, a steering control system 166 (also referred to herein as an "automatic steering system") to control vehicle operation. The automatic steering system 166 includes a handwheel motion detection switch 28 and an encoder 30 for interpreting guidance and steering commands from the CPU 6.
[0030] The automatic steering system 166 can mechanically interact with the vehicle's steering column 34, which is mechanically attached to the steering wheel 32. A control line 42 can transmit guidance data from the CPU 6 to the automatic steering system 166. An electrical subsystem 44, which powers the electrical needs of the vehicle 100, can interface directly with the automatic steering system 166 via a power cable 46. The automatic steering system 166 can be mounted on the steering column 34 near the vehicle floor and in the vicinity of the vehicle control pedals 36. Alternatively, the automatic steering system 166 can be mounted at other locations along the steering column 34.
[0031] The automatic steering system 166 physically drives and steers the vehicle 100 or 110 by actively turning the steering wheel 32 via the steering column 34. A motor 45 powered by the vehicle's electrical subsystem 44 can drive a helical drive that turns a helical gear 48 affixed to the automatic steering system 166. These components are preferably enclosed in a housing. In other embodiments, the automatic steering system 166 is integrated directly into the vehicle's drive control system, independently of the steering column 34.

II. THREE-DIMENSIONAL TERRAIN MAPPING
[0032] Embodiments of the present disclosure can be used to generate three-dimensional (3D) terrain maps (also known as three-dimensional elevation models). These maps can be generated using data from various sources, such as satellite imagery, surveying using a global navigation satellite system (GNSS) such as the Global Positioning System (GPS), surveying using radar or lidar, images and sensor data captured from land vehicles, aerial imagery from airplanes or drones, and other data. The different methods will have different spatial and height resolutions.
[0033] Figure 3 illustrates a method 300 for generating a 3D terrain map according to various aspects of the present disclosure. In this example, method 300 includes identifying, by a terrain mapping system (for example, implemented by the B100 control system in Figure 1B), a topography of the soil surface for a section of terrain (305); identifying a topography of the vegetation over the terrain section (310); generating a two-dimensional representation of a 3D terrain map, including the topography of the soil surface for the terrain section and the vegetation topography over the terrain section; displaying the 3D terrain map (320); and transmitting the 3D terrain map to another system or device (325).
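The soil-surface and vegetation steps of method 300 can be sketched in code. This is purely a hypothetical illustration: the data structures, function names, and grid-cell representation below are assumptions for clarity, not the disclosure's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class TerrainMap3D:
    ground: dict = field(default_factory=dict)      # (x, y) cell -> soil-surface elevation
    vegetation: dict = field(default_factory=dict)  # (x, y) cell -> vegetation-top elevation

def generate_terrain_map(ground_hits, canopy_hits):
    """Steps 305/310: collect soil-surface and vegetation topography per cell."""
    m = TerrainMap3D()
    for (x, y, z) in ground_hits:
        m.ground[(x, y)] = z          # step 305: soil-surface topography
    for (x, y, z) in canopy_hits:
        m.vegetation[(x, y)] = z      # step 310: vegetation topography
    return m

def vegetation_height(m, cell):
    """Crop height for one cell: vegetation-top minus soil-surface elevation."""
    return m.vegetation[cell] - m.ground[cell]
```

A map built this way directly supports the crop-height queries discussed later in the disclosure (for example, a cell with ground at 101.2 m and canopy at 102.0 m implies roughly 0.8 m of crop above the soil surface).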
[0034] In method 300, the system can identify a topography of the soil surface for a section of terrain (305) based on data received from a sensor system (for example, the B130 sensor system in Figure 1B) and a positioning system
[0035] Method 300 also includes identifying (for example, based on data received from the sensor system and the positioning system) a vegetation topography over the terrain section (310).
[0036] Method 300 includes generating a two-dimensional representation of a three-dimensional terrain map that includes the topography of the soil surface for the terrain section and the vegetation topography over the terrain section. In some embodiments, the terrain mapping system implementing method 300 in Figure 3 includes a display screen (for example, as part of the B150 user interface in Figure 1B), and the terrain mapping system displays (320) the three-dimensional terrain map on the display screen.
[0037] The system can identify a plurality of objects within the terrain section and provide visual indicators for each object on the 3D terrain map. In addition, the 3D map can include a respective visual indicator on the map for each respective object that represents whether the object is passable by the vehicle. For example, the 3D map may include color-coded objects, with red indicating impassable objects, green indicating passable objects, and yellow indicating that a human operator must authorize the plotting of a route over/through that object by a vehicle.
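The color coding described above can be sketched as a simple lookup. The three categories and their colors come from the text; the category names and API are illustrative assumptions.

```python
# Passability categories follow the disclosure's red/green/yellow scheme;
# the dictionary keys and helper names are hypothetical.
PASSABILITY_COLORS = {
    "impassable": "red",              # vehicle must route around the object
    "passable": "green",              # a route over/through it may be plotted
    "needs_authorization": "yellow",  # a human operator must approve the route
}

def indicator_color(passability):
    """Map color for an object's passability classification."""
    return PASSABILITY_COLORS[passability]

def requires_operator(passability):
    """True when a human must authorize plotting a route over this object."""
    return passability == "needs_authorization"
```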
[0038] In some embodiments, the generation of the three-dimensional terrain map includes the identification of a height of a portion of the vehicle above the ground surface. Among other things, the system can determine a depth of tracks made by the vehicle based on a change in the height of the vehicle portion above the ground surface.
[0039] In some embodiments, for example, a 3D sensor system can be used to measure the surface of the terrain relative to the sensor's mounting pose on the vehicle. In some embodiments, the sensor system may include a rotating lidar system adapted to sweep a number of laser beams around the sensor's Z axis at a high frequency. Additionally or alternatively, the sensor system may include an array of static laser beams, a stereo camera based on two or more cameras, or another 3D imaging or scanning device.
[0040] In some embodiments, 3D sensors can provide information when no previous GNSS height/terrain model information is available, as well as provide very detailed maps (for example, with a resolution of about 2 cm). Embodiments of the present disclosure can use GNSS to avoid deviations in the measurement of vegetation height or other characteristics of the terrain. In some cases, especially if high-precision GNSS is not available, the system may use previously generated elevation map data, especially if it has better accuracy than GNSS. Figure 7 illustrates an example of a vehicle with 3D sensors comprising a laser beam array to determine the height of crops planted in a section of the ground relative to the soil surface.
[0041] In some embodiments, the system can identify the height of vegetation above the soil surface during periods when a vehicle is driving between crop rows, such that the crop edge may be more visible (for example, due to the absence of crops between the rows, or only sparse stubble from previous crops).
[0042] By using existing 3D terrain maps together with sensor readings, the system helps to create a better estimate of the current terrain and is able to better accommodate changes. This can help to improve steering performance, provide valuable advance information for height control of wide implements so that the height can be adjusted smoothly, and/or prevent damage. Embodiments of this disclosure can also be used to help accelerate or decelerate the vehicle (for example, through an automatic or semi-automatic vehicle control system) to increase operator comfort, or to traverse rough terrain in a way that reduces stress on vehicles and implements. In some cases, the system can also plot a new route for the vehicle to avoid an area.
[0043] In some embodiments, the system can be used to detect that a vehicle is sinking into the ground based on parameters such as tire pressure or load on the vehicle, and the height of the GNSS antenna above the ground plane. For example, a 3D sensor measurement can be used to detect the actual height of the GNSS antenna above the ground surface. If the vehicle is sinking into the ground, the change in antenna height can be used to measure the depth of the tracks made by the vehicle, to determine the degree to which the vehicle is sinking into the ground. Figure 8, for example, represents the height of crops relative to the ground, as well as the depth below ground level of the tracks made by the vehicle.
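The antenna-height idea above reduces to a subtraction: the drop from the antenna's nominal mounting height to its currently measured height over the ground approximates the track depth. A minimal sketch, in which the function name and the example 3.00 m nominal height are illustrative assumptions:

```python
def track_depth(nominal_antenna_height_m, measured_antenna_height_m):
    """Estimated depth of the wheel tracks pressed into the soil.

    nominal: antenna height above firm ground from the vehicle geometry;
    measured: actual antenna height reported by the 3D sensor.
    Clamped at zero so suspension bounce above nominal reads as no sinking.
    """
    return max(0.0, nominal_antenna_height_m - measured_antenna_height_m)

# e.g. antenna nominally 3.00 m above ground, currently measured at 2.93 m:
# the vehicle has sunk roughly 7 cm into the soil.
```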
[0044] In some embodiments, the sensor system may include a camera capturing two-dimensional (2D) images. Images can have a variety of different resolutions or other characteristics. For example, the camera can capture images in the human-visible spectrum (for example, red-green-blue or "RGB" images) or at other wavelengths of interest. In another example, images can be captured in an infrared (IR) or near-infrared (NIR) spectrum. For 3D agricultural terrain maps, for example, embodiments of the present invention can use NIR images, since the NIR reflectance of plants is often high, and plant health indexes, such as a normalized difference vegetation index (NDVI), can be calculated based on 3D maps generated using NIR image data.
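NDVI, mentioned above, has a standard per-pixel definition: NDVI = (NIR − Red) / (NIR + Red). A minimal sketch over lists of reflectance values (the list-based representation is an illustrative choice, not part of the disclosure):

```python
def ndvi(nir, red, eps=1e-9):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red).

    nir, red: sequences of reflectance values in [0, 1].
    eps guards against division by zero on dark pixels.
    """
    return [(n - r) / (n + r + eps) for n, r in zip(nir, red)]
```

Healthy vegetation reflects strongly in NIR, so its NDVI approaches +1, while bare soil or water yields values near zero or below; this is what makes the index useful as a plant-health overlay on a 3D terrain map.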
[0045] The 3D terrain map can be generated using data from several different sources. For example, the system can generate the 3D map by fusing terrain point clouds with GNSS and IMU data into a detailed 3D terrain map in a global reference frame.
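Fusing a sensor-frame point cloud into a global frame amounts to rotating each point by the vehicle's attitude and translating by its GNSS position. A hedged sketch follows; a full implementation would use the IMU's 3D attitude (roll, pitch, yaw), while this simplified version rotates by heading only, and all names are assumptions.

```python
import math

def to_global_frame(points_sensor, vehicle_xy, heading_rad):
    """Transform sensor-frame points (x forward, y left, z up) to the
    global frame using the vehicle's GNSS position and heading."""
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    out = []
    for (x, y, z) in points_sensor:
        gx = vehicle_xy[0] + c * x - s * y   # 2D rotation by heading,
        gy = vehicle_xy[1] + s * x + c * y   # then translation to the fix
        out.append((gx, gy, z))              # z kept as elevation offset
    return out
```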
[0046] Embodiments of this disclosure can generate 3D terrain maps for particular use in farm/agricultural applications. For example, generating a three-dimensional terrain map may include determining the height of a portion of vegetation (e.g., crops) above the surface of the soil, to help determine whether a crop is ready for harvest, to identify shrubs that may need to be cleared from a field before planting, to assess crop health, and for other uses.
[0047] The system can also use data (in real time or near real time) from the positioning system and/or sensor system to identify discrepancies in a pre-existing 3D terrain map and update the 3D terrain map accordingly. For example, the system can modify a pre-existing feature of a pre-existing three-dimensional terrain map based on the topography of the soil surface for the terrain section and the vegetation topography over the terrain section to reflect, for example, the growing or harvesting of crops on the land.
[0048] In some embodiments, the terrain mapping system can identify a moisture level in a section of terrain described on a 3D terrain map, and provide information on the moisture. For example, the system can identify a first level of moisture in a first portion of the section of terrain (for example, a relatively dry portion of a field), and identify a second level of moisture in a second portion of the section of terrain
[0049] Likewise, the system can identify a body of water in the section of terrain, such as a puddle, pond, lake, river, or stream, as well as determine whether a particular vehicle is capable of crossing the body of water. In determining traversability, the system can determine a rate of water flow through the body of water, as well as a depth of the body of water. In cases where the body of water cannot be crossed, the system can identify (for example, visually on the 3D terrain map) a route for the vehicle to bypass the body of water.
[0050] The system can indicate a variety of different features on the 3D terrain map. In addition to natural features (for example, mountains, streams, trees, ravines, etc.), the system can indicate artificial features, such as fences, power lines, roads, etc. In some embodiments, the system can indicate a route for one or more vehicles on the 3D terrain map. For example, the system can draw wheel tracks on the map (for example, using a specific color of lines) to represent the route to be traveled by a vehicle. The track lines can be spaced based on the vehicle's wheel base.
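The spaced track lines above can be computed by offsetting the route centerline by half the lateral wheel spacing on each side. A sketch under stated assumptions: poses are (x, y, heading) tuples, the spacing is treated as the left-right wheel distance, and a real map renderer would additionally handle line color and curvature.

```python
import math

def wheel_tracks(centerline, spacing_m):
    """For each (x, y, heading_rad) pose on the route centerline, return
    the left and right wheel-track points offset by half the spacing."""
    half = spacing_m / 2.0
    left, right = [], []
    for (x, y, hdg) in centerline:
        # unit vector perpendicular to the heading, pointing left
        px, py = -math.sin(hdg), math.cos(hdg)
        left.append((x + half * px, y + half * py))
        right.append((x - half * px, y - half * py))
    return left, right
```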
[0051] In some embodiments, for a map generated using GNSS data captured from a vehicle that crosses a section of terrain, there may only be measurements from where the vehicle was driven. The rest of the map can thus be determined by the system based on measurements from the vehicle's positioning systems/sensors. Depending on how the field is worked, such measurements can be very dense or very sparse (for example, in controlled-traffic farming, where there are only lanes every 12 meters).
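Filling in map cells between sparse measurement lanes is an interpolation problem. The disclosure does not name a scheme, so the inverse-distance weighting below is purely an illustrative assumption of how unmeasured cells might be estimated from lane measurements:

```python
def idw_elevation(query_xy, samples, power=2.0):
    """Inverse-distance-weighted elevation estimate for an unmeasured cell.

    samples: list of ((x, y), elevation) pairs measured along driven lanes.
    Nearby measurements dominate; an exact hit returns its measurement.
    """
    num = den = 0.0
    for (x, y), z in samples:
        d2 = (x - query_xy[0]) ** 2 + (y - query_xy[1]) ** 2
        if d2 == 0.0:
            return z  # query coincides with a measured point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * z
        den += w
    return num / den
```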
[0052] The system can transmit (for example, using the B160 transceiver in Figure 1B) an electronic communication comprising the three-dimensional map to another system or device, such as a vehicle control system. For example, the system can transmit the 3D terrain map to a plurality of other vehicles operating in the same area (for example, within the same field) to allow the vehicles to coordinate their routes and operations.
[0053] Embodiments of this disclosure may use updated data to help continually improve the accuracy of 3D terrain maps. For example, the system can map the environment (for example, based on current GNSS/INS automatic steering systems) and then continually update the terrain model shown on the 3D map based on data from sensors coupled to one or more vehicles crossing the terrain.
[0054] In some embodiments, the system can continuously log all the sensor inputs and performance parameters of the system and transmit them to another system (for example, a cloud service) that can analyze the data from multiple vehicles. By receiving information from multiple vehicles and training forecasting models based on such data, embodiments of the disclosure can help vehicle control systems to deal with more difficult scenarios without human intervention, thereby providing improvements over conventional autonomous or semi-autonomous vehicle control systems.
[0055] In some cases, the 3D terrain map can be based on a variety of information from different sensors. Such information may include, for example, 3D point cloud data, images, GNSS data, INS data, speed data for a vehicle (for example, based on wheel speed), vehicle characteristics (for example, tires) and other information.
[0056] The 3D terrain map can also be generated based on data from other sources, such as historical data (for example, previously generated terrain maps), weather information and terrain information, such as soil information, information about depreciation, expected water evaporation based on soil type, etc. In this way, embodiments of the present disclosure can help to make better plans for performing tasks, as well as improving the system's ability to deal with unforeseen scenarios.
[0057] Embodiments of the present disclosure can also use machine learning to optimize maps using sensor input analysis algorithms and controllers to improve performance. The system can also provide updated maps and revised algorithms to maintain the accuracy of the system.

III. OPTIMIZATION OF VEHICLE CONTROL
[0058] Among other things, embodiments of the present disclosure may use 3D terrain maps to help improve the steering performance of vehicle control systems, particularly on uneven or undulating terrain. For example, 3D terrain maps can be used to help plan routes for vehicles driving on a side slope (for example, which elevation changes should be anticipated). In another example, the system can use slip predictions to improve steering (for example, in turns).
[0059] In addition, in cases where a vehicle is coupled to a vehicle implement (for example, a tractor towing a plow or disc), the direction of a passive implement can be determined relative to the vehicle in such a way that a route can be planned to compensate for the implement's path through turning points of the terrain, such as terrace tops or channel valleys. Often, these areas can show major pass-to-pass errors unless the driver takes over to correct the vehicle's location. Embodiments of the present disclosure, on the other hand, can provide better pass-to-pass positioning, even in undulating conditions, using 3D terrain map information, as well as data from the sensor system and positioning systems coupled to the vehicle. In Figure 9, for example, the system can identify the dimensions of the section of undulating terrain to be traversed by a vehicle in order to plan the vehicle's route to cover the undulating section optimally (for example, using three passes corresponding to the three segments shown in Figure 9).
[0060] Figure 4 illustrates an example of a method 400 that can be implemented by a vehicle control system (for example, the systems represented in Figures 1A, 1B and/or 2). In this example, method 400 includes determining a position of a vehicle (for example, coupled to the vehicle control system) based on location data from a positioning system (405); identifying a 3D terrain map associated with the position of the vehicle (410); determining a route for the vehicle based on the 3D terrain map (415); identifying a terrain feature based on data from a sensor system (420); modifying or maintaining the vehicle's route based on the identified terrain feature (425); and displaying the 3D terrain map (430).
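One iteration of method 400 can be sketched as a loop body over injected subsystems. This is a hypothetical illustration only: the callables stand in for the positioning system, map store, sensor system, and display of Figure 1B, and none of these names come from the disclosure.

```python
def method_400_step(get_position, lookup_map, plan_route,
                    detect_feature, replan, show):
    """One pass of method 400 with subsystem behaviors passed in as callables."""
    pos = get_position()                  # 405: position from GNSS/LPS/INS
    terrain_map = lookup_map(pos)         # 410: 3D terrain map for this position
    route = plan_route(terrain_map, pos)  # 415: route based on the 3D map
    feature = detect_feature()            # 420: e.g. a hole, ditch, or elevation
    if feature is not None:
        route = replan(route, feature)    # 425: modify (otherwise maintain) route
    show(terrain_map, route)              # 430: display the map and route
    return route
```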
[0061] In some embodiments, the system that implements method 400 may include a steering control system (such as steering system 166 in Figure 2) to control vehicle operation. In some embodiments, the steering control system can be implemented as a component of a user interface (for example, user interface B150 in Figure 1B). The steering control system can be adapted to drive and guide the vehicle along the given route.
[0062] The system can also include a display (for example, as a B150 user interface component in Figure 1B) and the system can display (430) a two-dimensional representation of the three-dimensional map on the display. Likewise, the system can show a visual representation of the route in conjunction with the three-dimensional map display on the display.
[0063] The system can determine a route for the vehicle (415) based on a variety of factors and criteria. For example, the system can generate a path for an agricultural vehicle (such as a tractor) coupled with a vehicle implement (such as a seeder) to cross a section of the land (such as a field to be sown).
[0064] The system can identify one or more features of the terrain (420) associated with a section of the terrain at any suitable time, including during the initial generation of the route or after the vehicle begins to traverse the route. The system can analyze the identified terrain features to determine the vehicle's route (415), as well as to modify or maintain (425) an existing route for a vehicle. For example, the system can identify a terrain feature comprising an elevation, identify a slope of the elevation, and determine whether the elevation is traversable by the vehicle. In some embodiments, the system may stop the operation of a steering control system controlling the vehicle automatically or semi-automatically in response to the identification of one or more features of the terrain (for example, handing control over to a human operator).
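The slope-traversability check above can be reduced to comparing an elevation's grade against a vehicle-specific limit. A minimal sketch, in which the function names and any particular maximum-slope value are illustrative assumptions:

```python
import math

def slope_deg(rise_m, run_m):
    """Slope angle in degrees from an elevation's rise over its horizontal run."""
    return math.degrees(math.atan2(rise_m, run_m))

def can_traverse(rise_m, run_m, max_slope_deg):
    """True when the feature's slope is within the vehicle's rated maximum."""
    return abs(slope_deg(rise_m, run_m)) <= max_slope_deg
```

When `can_traverse` returns False, the system could replan around the feature or hand control to a human operator, as the text describes.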
[0065] In many cases, it is common for vehicle implements (such as sprayers) to travel over or through undulating obstacles such as terraces and drainage channels. These obstacles can introduce transient motion off the desired path as the vehicle's control system tries to react quickly to changes in terrain. For small or short obstacles, it would be best if the vehicle's control system did nothing to compensate for the disturbance from the obstacles, as disruption of the route traveled is minimized by allowing the vehicle to drive in a straight line instead of taking large control actions to compensate for the disturbance. Control compensation can cause transient effects that persist longer than the transient effects of the obstacles would if no corrective control action had been taken. In some embodiments, the system can analyze the features of a 3D terrain map to identify the duration of such disturbances and minimize the amount of corrections it tries to make based on what could be a potentially large error in GNSS and INS sensor feedback.
[0066] [0066] Embodiments of the present disclosure can thus provide automatic or semi-automatic steering systems that assess the terrain to be covered by a vehicle based on historical map data (for example, from a 3D terrain map) and/or on sensor data collected in real time or near real time. Conventional systems, on the other hand, can only measure the current pose of the vehicle, and conventional controllers continuously try to bring the vehicle back onto the route, often leading to a control reaction that is too late and, in some cases, not optimal given the duration of the disturbance. The disturbance could be a longer change due to a slope versus a very short change due to a small hole, or a short period of time involved in crossing a ditch.
[0067] [0067] For example, if the system identifies a terrain characteristic, such as a hole or ditch, the system may use data from a sensor system (for example, including a Lidar sensor and/or image capture device) to evaluate whether the terrain characteristic is passable, then modify the route, speed, or other characteristic of the vehicle (if necessary) in order to traverse the terrain optimally.
[0068] [0068] In this way, embodiments of the present disclosure help to improve the performance and response time of vehicle control systems, especially when running at high speed. Embodiments of the present disclosure can use measurements from a sensor (for example, a measured oscillation after hitting a bump) to determine a roughness coefficient for the surface of the terrain, thus helping to identify terrain with clumps of dirt, rocks, or other small characteristics that may be passable by the vehicle but that may warrant passing over them at a reduced speed.
[0069] [0069] In some cases, when a vehicle (such as a tractor) is driving on sloping ground, the pitch and roll angles that the vehicle experiences may change with the direction the vehicle's body is facing. For example, if the vehicle is facing up the slope, then the vehicle is pitched upward; if the vehicle is traveling along the slope, then the vehicle is rolled to one side.
[0070] [0070] When the expected elevation of the ground is known to the control system through analysis of a 3D terrain map, the system can correlate the vehicle's current pitch and roll angles with the expected pitch and roll angles, thus allowing the system to calculate a heading measurement for the vehicle body. This heading measurement can be merged with other sensor data to help provide a better estimate of vehicle state, improving the robustness and accuracy of the control system's performance.
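One way such a heading measurement could be derived is to search for the heading whose predicted pitch and roll best match the measured angles, given the local slope and downhill direction from the 3D terrain map. The small-angle model and all names below are illustrative assumptions, not the disclosure's actual fusion method:

```python
import math

def predicted_pitch_roll(slope_rad, aspect_rad, heading_rad):
    """Small-angle prediction of pitch and roll for a vehicle on a uniform
    slope. slope_rad is the terrain steepness; aspect_rad is the downhill
    direction (measured like a heading)."""
    rel = heading_rad - aspect_rad
    pitch = -slope_rad * math.cos(rel)   # facing downhill -> pitched down
    roll = slope_rad * math.sin(rel)
    return pitch, roll

def estimate_heading(meas_pitch, meas_roll, slope_rad, aspect_rad, step_deg=1):
    """Pick the heading whose predicted pitch/roll best matches measurements."""
    best, best_err = 0.0, float("inf")
    for d in range(0, 360, step_deg):
        h = math.radians(d)
        p, r = predicted_pitch_roll(slope_rad, aspect_rad, h)
        err = (p - meas_pitch) ** 2 + (r - meas_roll) ** 2
        if err < best_err:
            best, best_err = h, err
    return best
```

In practice this estimate would be one input to a sensor-fusion filter rather than used on its own, since on flat ground the measurement carries no heading information.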
[0071] [0071] In addition, vehicle control (and vehicle implement) can be improved by embodiments of the present disclosure, for example, using a 3D terrain map to predict future changes or terrain disturbances that the vehicle may encounter. Such future information can be used to allow the vehicle to take preemptive control actions to minimize the effect of a future terrain change or disturbance.
[0072] [0072] The system can determine the path of a vehicle based on a task to be performed by the vehicle or on a desired result of the vehicle traversing the path. For example, the system can determine the vehicle's path to help optimize water management, provide vehicle operator safety on mountainous or sloping terrain (for example, protecting against vehicle rollover), and account for land leveling and erosion (for example, tracking how the land is changing over time to plan the use of terraces).
[0073] [0073] In addition, the system can plan routes for vehicles traveling along a slope, in comparison to traveling up or down the slope, in order to save fuel. The system can additionally update the 3D terrain map as the vehicle travels along the route (for example, to identify boundaries, hay bales, or obstacles) to help improve the accuracy of the map. In addition, embodiments of the present disclosure can be used to enhance the capability of control systems for vehicles with limited positioning systems (for example, GNSS only) by using the vehicle positioning system information in conjunction with the information in the 3D terrain map.
[0074] [0074] In some embodiments, the system can plan the route for the vehicle based on the 3D terrain map in order to help segment a non-convex field and determine the driving direction in that field for optimal coverage (for example, based on fuel usage and time). The system can also plan the route of a vehicle in such a way that the spacing between two passes with an implement (such as a seeder) is constant, even if the terrain is undulating, to help provide better coverage in the field and allow users to plan the use of their fields in the best possible way.
[0075] [0075] The system can use information from the 3D terrain map and information from a sensor system to detect the headland of a field in order to determine a route for a vehicle that provides full coverage by the implement (for example, identifying at which points of the route to raise/lower the implement to cover the field). In conventional systems, on the other hand, a user needs to set a boundary when driving through the field. In addition, the user also needs to define any exclusion boundaries (obstacles) in the field, and the boundaries are assumed to be static for a given field.
[0076] [0076] In some embodiments, the system can identify vegetation to be planted by the vehicle on at least a portion of the land represented on the three-dimensional terrain map, determine a water management process to irrigate the vegetation, and determine a vehicle path for planting the vegetation that corresponds with the water management process. Likewise, the system can determine a respective expected fuel consumption rate for each of a plurality of potential routes for the vehicle, and determine the vehicle's route based on the determined fuel consumption rates (for example, choosing the route that has the lowest fuel consumption rate).
[0077] [0077] In addition or alternatively, the system can determine a respective expected time for the vehicle to travel each of a plurality of potential routes for the vehicle, and determine the vehicle's route based on the determined transit times (for example, selecting the route with the shortest time). In some embodiments, the selection of a route based on travel time over a section of terrain may depend on a specific terrain characteristic. For example, the system can determine a route that completely ignores the feature (for example, if it is easily passable) or takes steps to avoid it (for example, if the feature is impassable, would cause damage to the vehicle, would cause the vehicle to get stuck, etc.). The vehicle's control system can also cause the vehicle to slow down and travel a longer distance to avoid a terrain feature. In some cases, the time difference can be significant (especially for a large field), and in some embodiments, the vehicle's control system can determine any additional time needed to avoid a feature and report it to a human operator (for example, the field planner).
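Route selection over combined fuel and time criteria can be sketched as a simple weighted-cost minimization. The dictionary keys and the weights below are hypothetical; the disclosure does not specify how the criteria are combined:

```python
def choose_route(routes):
    """Pick the route minimizing a weighted cost of fuel and time.

    routes: list of dicts with hypothetical keys 'name', 'fuel_l', 'time_s'.
    The weights are illustrative, not from the source.
    """
    fuel_weight, time_weight = 1.0, 0.01  # litres vs seconds trade-off
    def cost(r):
        return fuel_weight * r["fuel_l"] + time_weight * r["time_s"]
    return min(routes, key=cost)
```

An operator-facing system might instead report both candidate routes with their respective times, as described above, and let the field planner decide.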
[0078] [0078] The system can compare an identified terrain characteristic based on sensor data to a corresponding terrain characteristic on the three-dimensional map and modify a corresponding terrain characteristic on the three-dimensional map based on the identified terrain characteristic.
[0079] [0079] Embodiments of the present disclosure can identify a boundary of an area to be crossed by the vehicle (for example, a fence around a field), determine a radius of curvature of the vehicle, and determine the vehicle's path to cross the identified area within the radius of curvature of the vehicle and without colliding with the boundary. In this way, the system can help ensure that a vehicle and its implements safely cross the headland of a field without colliding with obstacles or boundaries at the edge of the field.
[0080] [0080] The vehicle path can be determined based on the functions to be performed by one or more implements coupled to (or integrated with) a vehicle. For example, the system can identify one or more points along the path in which a feature of an implement attached to the vehicle is engaged or disengaged.
[0081] [0081] The system can modify or maintain the vehicle's route (425) based on a variety of criteria, including: determining an expected time for the vehicle to cross or avoid the identified terrain characteristic, and/or determining whether the identified terrain feature is passable by the vehicle (for example, a fence or lake versus a small ditch or stream).
[0082] [0082] The system can identify terrain characteristics (420) based on a variety of sensor data. For example, in a sensor system that includes an accelerometer, identifying the terrain characteristic may include identifying a level of terrain roughness based on data from the accelerometer as the vehicle passes over the terrain. In some embodiments, the system can adjust the vehicle's speed based on the roughness of the terrain.
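A minimal sketch of roughness estimation from accelerometer samples, with a corresponding speed adjustment, might look like the following; the standard-deviation proxy, the threshold, and the speed-halving rule are illustrative assumptions:

```python
def roughness_level(vertical_accels_mps2):
    """Estimate terrain roughness as the standard deviation of vertical
    acceleration samples collected while the vehicle passes over the terrain
    (a common simple proxy for surface roughness)."""
    n = len(vertical_accels_mps2)
    mean = sum(vertical_accels_mps2) / n
    var = sum((a - mean) ** 2 for a in vertical_accels_mps2) / n
    return var ** 0.5

def adjusted_speed(base_speed_kph, roughness, rough_threshold=1.0):
    """Halve the speed on rough ground; both values are illustrative."""
    return base_speed_kph / 2 if roughness > rough_threshold else base_speed_kph
```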
[0083] [0083] In another example, in a sensor system that includes a gyroscopic sensor, identifying a feature of the terrain (for example, an elevation/hill) may include determining one or more of: a roll angle for the vehicle, a pitch angle for the vehicle, and a yaw angle for the vehicle. In alternative embodiments, other types of sensors (for example, an accelerometer) can also be used to determine a vehicle's attitude characteristics. IV. VEHICLE IMPLEMENT CONTROL
[0084] [0084] Figure 5 provides an example of a method 500 that can be used to control the characteristics and functions of a variety of vehicle implements. As with all methods described in Figures 3-6, the characteristics of method 500 can be practiced alone, in part, or in conjunction with any of the other methods described herein. Method 500 can be performed by a vehicle implement control system (for example, control system B100 shown in Figure 1B). The vehicle implement control system can be separate from, or implemented by, a vehicle control system.
[0085] [0085] Embodiments of the present disclosure can be implemented in conjunction with a variety of vehicle implements, including (for example): a seeder, a fertilizer spreader, a plow, a disc, a combine, a baler, a rake, a grass cutter, a harrow, a planter, a cultivator, a pesticide sprayer, a shredder, a grain cart, a trailer, a conditioner, and combinations thereof. The vehicle implement can be integrated with a vehicle (for example, as in the case of a combine) or attached to a vehicle (for example, in the case of a tractor attached to a plow).
[0086] [0086] For example, a fertilizer spreader may need to adjust its distribution pattern depending on the terrain to maintain an even distribution and coverage width. Embodiments of the present disclosure can control the operation of the spreader (or provide data on the terrain to the spreader itself) so that the spreader is adjusted accordingly. Likewise, modern fertilizer spreaders can adjust the width and amount of fertilizer in motion to perform variable-rate precision farming applications, and the vehicle implement control system in this disclosure can help to improve and optimize the spreader's distribution pattern. In Figure 10, for example, a spreader is represented with a first section of the terrain (on the left) having a relatively higher elevation than the terrain on the right. In this example, the spread pattern can be adjusted by the system to spread fertilizer about 10 meters on the right side and a shorter distance on the left side, to take into account the difference in the terrain.
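The spread-pattern adjustment for uneven terrain might be sketched as follows; the square-root drop-height scaling is a simplified ballistic assumption for illustration, not a model given in the disclosure, and all names are hypothetical:

```python
def spread_distance_m(nominal_m, disc_height_m, ground_offset_m):
    """Adjust throw distance for ground that is higher (+) or lower (-)
    than the spreader's own ground level.

    Particles land sooner on higher ground and later on lower ground;
    a simple scaling by the square root of the effective drop captures
    that trend. Purely illustrative.
    """
    effective_drop = max(disc_height_m - ground_offset_m, 0.1)
    return nominal_m * (effective_drop / disc_height_m) ** 0.5
```

In the Figure 10 scenario, the left (higher) side would receive a shorter computed distance and the right side the nominal one, matching the asymmetric pattern described above.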
[0087] [0087] In the example shown in Figure 5, method 500 includes identifying one or more features of a section of terrain (for example, based on a three-dimensional map including the section of terrain and data from a sensor system) (505), determining a vehicle implement position (for example, based on data from the sensor system and the one or more identified terrain features) (510), and modifying a vehicle implement function based on the one or more identified terrain characteristics and the position of the vehicle implement (515).
[0088] [0088] In some embodiments, the system may include a positioning system, and the position of the vehicle implement can (additionally or as an alternative to other data) be determined based on the data from the positioning system. In a particular example, the positioning system includes a global navigation satellite system (GNSS) and does not include an inertial navigation system (INS). Instead of using an INS, the system can identify one or more features of the terrain by comparing the three-dimensional terrain map with GNSS data and data from the sensor system. In some embodiments, the system can change the three-dimensional map in response to comparing the three-dimensional terrain map with GNSS data and data from the sensor system (for example, to update the 3D terrain map). The sensor system can include any suitable number and type of sensors, including a radar sensor, a Lidar sensor, and/or an imaging device (such as a camera).
[0089] [0089] In cases where the vehicle implement is coupled to a vehicle, determining the position of the vehicle implement can be based on determining a size, shape, and weight for the vehicle implement and identifying an articulation angle between the vehicle and the vehicle implement.
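Given the implement geometry and the articulation angle, the implement position can be computed with simple planar geometry. All parameter names below are illustrative assumptions:

```python
import math

def implement_position(veh_x, veh_y, veh_heading_rad,
                       hitch_offset_m, drawbar_m, articulation_rad):
    """Position of a towed implement from the vehicle pose, the hitch
    location behind the vehicle, the drawbar length, and the articulation
    angle between vehicle and implement."""
    # Hitch point sits hitch_offset_m behind the vehicle reference point.
    hx = veh_x - hitch_offset_m * math.cos(veh_heading_rad)
    hy = veh_y - hitch_offset_m * math.sin(veh_heading_rad)
    # Implement trails the hitch along the articulated direction.
    trail = veh_heading_rad + articulation_rad
    ix = hx - drawbar_m * math.cos(trail)
    iy = hy - drawbar_m * math.sin(trail)
    return ix, iy
```

With zero articulation the implement lies directly behind the vehicle; during a turn the articulation angle swings the implement onto a different path, which is the situation paragraphs [0092]-[0095] address.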
[0090] [0090] In some embodiments, the vehicle implement may comprise a portion that is adjustable, and modifying the function of the vehicle implement (515) includes adjusting that portion of the vehicle implement. For example, a portion of a vehicle implement, such as a plow or disc, can be raised (to disengage from the ground) or lowered (to engage with the ground). The system can therefore raise or lower the vehicle implement portion based on, for example, the height of a given terrain feature (for example, to avoid the feature with the implement and/or avoid damage to the implement).
[0091] [0091] For example, for vehicle implements used in harvesting applications with header height control, the height of a portion of the implement can be controlled more efficiently compared to conventional systems, where such control is typically based on wheels or antennas that are close to the working point but provide little or no ability to look ahead at the terrain to be covered.
[0092] [0092] In another example, where the vehicle implement is coupled to a vehicle, modifying the function of the vehicle implement may include identifying a first path of the vehicle through the terrain section and identifying a second path of the vehicle implement through the terrain section, where the first path and the second path are different. This can occur, for example, in cases where the vehicle is towing the implement behind the vehicle.
[0093] [0093] In such cases, the function of the vehicle implement can be modified based on the difference between the first path and the second path. For example, the system can move a part of the vehicle implement to avoid collision with a terrain characteristic that is on the second path (for the vehicle implement) but not on the first path (for the vehicle). For example, the terrain feature may include an obstacle that can damage the implement or cause it to become stuck, such as a hole, a rut, a body of water, or an obstacle that extends above the plane of the terrain (such as a stone, a tree, or another vehicle).
[0094] [0094] In some embodiments, where the vehicle implement is coupled to a vehicle, determining the position of the vehicle implement can additionally be based on receiving, from a system coupled to the vehicle, a current vehicle speed and a current heading of the vehicle. For example, a vehicle control system coupled to the vehicle (for example, as shown in Figure 2) can communicate with the vehicle implement control system coupled to the vehicle control system (for example, implemented using the B100 system in Figure 1B) using wireless transceivers coupled to both systems (for example, transceiver B160 in Figure 1B).
[0095] [0095] The system can determine the position of the vehicle implement based on determining a current heading of the vehicle implement. The system can also determine that the vehicle implement's current heading is different from the vehicle's current heading. Such a case can occur when a vehicle towing a vehicle implement is making a turn.
[0096] [0096] In some cases, the assumption that a vehicle (such as a tractor) and a vehicle implement attached to the vehicle (such as a plow attached to the tractor) are in the same plane is not valid for rapidly undulating terrain, particularly when the vehicle operates at faster driving speeds and in situations where the vehicle's attitude rolls one way or the other due to a hole or rut. Figure 11 illustrates such an example, where a vehicle towing an implement (such as a disc) has its left set of wheels in a rut as it moves forward. Embodiments of the present disclosure can use data from a 3D terrain map and data from a sensor system to determine how the implement will slide over the terrain and make adjustments (for example, to the vehicle path or vehicle speed) to deal with holes or ruts.
[0097] [0097] In some embodiments, the system can alleviate the need for a positioning system with GNSS by determining characteristics of the vehicle implement (such as size, shape, weight, geometry, etc.), determining an articulation angle between the vehicle implement and the vehicle, and using data from a terrain map. In some embodiments, data from the sensor can be used by the system to determine a ground-level surface model, and the vehicle implement control system can be used to help control how the implement sinks into the ground. The system can use the 3D terrain map to determine the route that the implement will follow in relation to the path of the vehicle coupled to the implement.
[0098] [0098] In some embodiments, the system can filter the level of detail of the 3D terrain map based on the type of implement. For example, some implements may require very detailed information for control, while others (for example, wide implements) may require less detail. V. FORECASTING LAND TRAVERSABILITY FOR A VEHICLE
[0099] [0099] For many vehicles, particularly for agricultural vehicles, it is important to avoid damage to the fields when crossing parts of the land with excess moisture. For example, driving in muddy soft parts of the field will lead to extra compaction and deep tracks that are generally undesirable. It is also important that these vehicles avoid getting stuck in mud pools or other bodies of water to avoid time-consuming (and expensive) recovery efforts for the vehicle.
[0100] [0100] Furthermore, considering the expenses of many modern agricultural vehicles and their cost of operation, it is beneficial for operators of such vehicles to optimize the use of such vehicles. One factor that can have a considerable impact on the operational efficiency of an agricultural vehicle is the degree to which the vehicle's tracks or wheels slip (for example, due to mud and rain conditions) while following a specific route.
[0101] [0101] Among other things, embodiments of this disclosure can help to optimize the use of a vehicle by predicting the slip of the vehicle's wheels on the route ahead of the vehicle. For example, the optimal wheel slip depends on the type of soil (for example, concrete, firm soil, cultivated land, or soft soil/sand), but is typically in the 8 to 15% slip range.
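The slip ratio and the cited 8-15% band can be expressed directly; in such a system, ground speed would typically come from GNSS and wheel speed from the drivetrain. The function names are illustrative:

```python
def wheel_slip(wheel_speed_mps, ground_speed_mps):
    """Slip ratio: how much faster the wheel turns than the ground passes."""
    if wheel_speed_mps <= 0:
        return 0.0
    return (wheel_speed_mps - ground_speed_mps) / wheel_speed_mps

def in_optimal_range(slip, low=0.08, high=0.15):
    """The 8-15% band cited in the text; exact limits vary with soil type."""
    return low <= slip <= high
```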
[0102] [0102] In some embodiments, the system can report the expected rate of wheel slippage over various points on a route to be followed by a vehicle. For a specific vehicle, the operator (or a vehicle control system that works in conjunction with disclosure embodiments) can adjust the wheel slip by changing tire pressure, changing the vehicle's weight, or changing the load.
[0103] [0103] For example, many modern vehicles allow tire pressure to be inflated or deflated during operation in the field. The weight of a vehicle or implement attached to the vehicle can be changed by changing the ballast (additional weights) on the vehicle. Weights can also be modified, and some tasks planned better, based on knowledge about soft spots in the field identified by embodiments of this disclosure.
[0104] [0104] For example, during harvesting, goods transported by trailer can be loaded at the front of the trailer first to add more weight to the tractor and reduce the weight on the trailer's axles. For some implements (for example, those carried on the 3-point hitch and riding on the ground when working the soil), it is possible to raise the 3-point hitch and transfer more of the implement's weight onto the tractor's rear axle.
[0105] [0105] In some cases, the vehicle load can be changed by, for example, planning tasks where the vehicle is bringing material (for example, fertilizer to the field) and gradually reducing the transported weight as the material is distributed, or the vehicle is removing material (for example, crop harvesting where a trailer is gradually filled with material). For example, the vehicle's path can thus be planned by the system to traverse sections of terrain that have higher levels of humidity when the vehicle is lighter.
[0106] [0106] In some scenarios, the system can redirect a vehicle's route to avoid a specific wet area in the field and plan around it to avoid getting stuck and/or damaging the field.
[0107] [0107] Figure 6 illustrates an example of a method for predicting vehicle slip according to various aspects of the present disclosure. Method 600 can be performed using a vehicle control system (as described above). In this example, method 600 includes determining a vehicle position based on location data from a positioning system (605), identifying a three-dimensional terrain map associated with the vehicle position (610), determining a route for the vehicle based on the three-dimensional terrain map (615), and determining, based on data from the sensor system and the three-dimensional terrain map, a moisture level associated with a section of terrain along the vehicle's path (620). Method 600 also includes, in response to the determination of the moisture level associated with the terrain section, performing one or more of: adjusting a vehicle feature before crossing the terrain section, modifying the vehicle's route before crossing the terrain section (625), and measuring the vehicle's slip while crossing the terrain section (630).
[0108] [0108] In some embodiments, the system can determine whether the section of terrain is traversable by the vehicle without slipping, as well as predict a degree of slip (for example, as a percentage, as described above) that the vehicle is likely to experience when crossing the section of terrain. In some embodiments, the system can determine a fuel consumption rate associated with the degree of slip. Fuel consumption rates beyond a predetermined limit can, for example, lead to the section of terrain being considered impassable due to the high amount of slip and associated fuel consumption.
[0109] [0109] In addition to predicting the likely rate of slip, the system can measure the vehicle's slip while traversing the section of terrain (630). The slip rate can be recorded and added to the 3D terrain map to assist in planning future vehicle routes.
[0110] [0110] The system can adjust a variety of vehicle characteristics (625) in response to the humidity level determined in a section of terrain. For example, the system can inflate or deflate one or more tires attached to the vehicle. The system can also modify the vehicle's path (625) by, for example: identifying a first expected weight associated with the vehicle at a first point in the vehicle's path; identifying a second expected weight associated with the vehicle at a second point on the vehicle's path, the first weight being different than the second weight; and modifying the vehicle's path to cross the section of terrain when the vehicle is associated with the second expected weight.
[0111] [0111] For example, the second weight may be less than the first weight due to the consumption (for example, fuel) or distribution (for example, seed or fertilizer) of a material carried by the vehicle or a vehicle implement attached to the vehicle along the vehicle's path. In contrast, the second weight may be greater than the first weight due to the addition of a material carried by the vehicle or an implement of the vehicle coupled to the vehicle along the vehicle's path, such as crops harvested along the path traveled by the vehicle and implement. In this way, the system can plan for a vehicle to cross a particularly wet section of a field when it is at the lowest weight (to avoid sinking) or to cross the section with the heaviest weight to help give traction to the vehicle or its implements to go through the section. The system can also modify the vehicle's route to avoid the section of terrain altogether.
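Scheduling the crossing of a wet section for the lightest (or heaviest) expected vehicle weight can be sketched as a simple selection over the planned route; the function and its arguments are hypothetical:

```python
def best_crossing_index(route_weights_kg, prefer="lightest"):
    """Index along the route at which to schedule crossing a wet section.

    route_weights_kg: expected vehicle weight at each candidate point,
    e.g. decreasing as seed or fertilizer is distributed, or increasing
    as harvested crops are loaded. Choose the lightest point to avoid
    sinking, or the heaviest for extra traction.
    """
    pick = min if prefer == "lightest" else max
    return route_weights_kg.index(pick(route_weights_kg))
```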
[0112] [0112] The system can identify the humidity level in a section of the land based on data from a variety of sensors. In some embodiments, for example, the sensor system includes an imaging device, and determining the humidity level associated with a section of terrain includes: capturing a first image of at least a portion of the section of terrain at a first resolution using the imaging device; capture a second image of at least a portion of the terrain section at a second resolution, using the imaging device; capture a third image of at least a portion of the terrain section in a third resolution, using the imaging device, where the first resolution is greater than the second resolution, and the second resolution is greater than the third resolution; and georeferencing the first, second, and third images based on data from the positioning system and the three-dimensional terrain map.
[0113] [0113] In some embodiments, in addition to (or as an alternative to) identifying the moisture level of a section of the terrain, the system can determine the suitability of crossing the section of the terrain based on other characteristics of the terrain. For example, such a determination can be made based on operator comfort and / or wear on the vehicle or implement (for example, based on a determination of the roughness of the soil, particularly avoiding rough terrain that would be uncomfortable for the operator and would cause damage to equipment through excessive shaking and vibration). In another example, the system can analyze the type of soil in a section of the terrain (for example, based on data from the 3D terrain map or sensor system) to determine whether to cross a section of the terrain. In a specific example, the system may choose to avoid crossing very sandy soil in favor of crossing a nearby piece of gravel to prevent the vehicle's wheels from skidding.
[0114] [0114] Such images can be taken of regions of interest in front of the vehicle - typically along the planned route for the vehicle. An example could be to take a high resolution image fragment in front of the vehicle, a medium resolution image fragment further away and a third low resolution fragment further away. The images are georeferenced so that they can be correlated with the slippage measured at that location.
[0115] [0115] In some embodiments, determining the moisture level associated with the terrain section includes identifying a depression in the terrain section based on the three-dimensional terrain map. The moisture level can also be determined based on the analysis of climate data indicating an actual or predicted level of precipitation associated with the section of land. Determining the moisture level associated with the terrain section may also include performing an image recognition process on an image of the terrain section captured by the image capture device (for example, to identify standing water against the surrounding soil).
[0116] [0116] In some embodiments, the geometry (elevation) of the field can be measured and georeferenced. This can be based on data from a GNSS or INS, data from a 3D terrain map, or data from sensors such as LIDAR or stereo cameras. The slip corresponding to the image locations can be measured, georeferenced, and used as a label to train a slip forecasting model. Additional feature inputs can be used to train the slip prediction model, including vehicle features.
[0117] [0117] For example, the current tire pressure of the vehicle, the current vertical load on the vehicle axle, and/or the current load (for example, engine load, PTO load, and/or traction load) can each be georeferenced, recorded, and used as training features for the model. Other input features may include the vehicle model/type, the vehicle implement model/type, the load on a trailer (for example, based on weight sensors or the fill level of sprayers or spreaders), the depth to which the vehicle is sinking into the ground (for example, measured by terrain sensors on stable ground), the speed of the vehicle, a task being performed by the vehicle and/or vehicle implement, the type of crop being planted, tended, or harvested, and/or other characteristics.
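Training a slip-forecasting model from georeferenced measurements can be illustrated, in the simplest case, as an ordinary least-squares fit of measured slip against a single feature such as axle load; a real model would use the many features listed above and a more capable learner:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b for one training feature
    (e.g. axle load) against measured, georeferenced slip labels."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def predict_slip(model, x):
    a, b = model
    return a * x + b
```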
[0118] [0118] The axle load on the driving wheels and the drawn load can change during operation due to both the rough surface and variations in the ground. The load can also change due to loading material into or out of the vehicle. For example, for an implement attached to a vehicle (for example, a tractor) to work the soil, the load may depend on the geometry of the field, speed, soil conditions, and other factors.
[0119] [0119] The weight may depend on how much of the implement's weight is carried by the tractor and how much of the tractive forces result in a downward force on the driving axles.
[0120] [0120] Slip measurements for different vehicles can also be used in training a slip forecast model. The slip measured for a specific vehicle may be different for another vehicle with different loads, tires, etc. Thus, the training data collected by the system can be processed to be vehicle-independent and normalized. For example, if a load is changing, that data input can be taken into account before training.
[0121] [0121] The expected rate of slip for a vehicle can be determined from a variety of data sources. For example, the slip rate can be determined based on data from a camera/sensor or farm management information system along the vehicle's path. The prediction values from these data can be calibrated based on the actual measured slip for the current state of the machine (for example, current tire wear, load, tire pressure, weight distribution, etc.). EXAMPLES
[0122] [0122] The following refers to examples of embodiments of the present disclosure. Any of the following examples can be combined with any other example (or combination of examples), unless explicitly stated otherwise. The foregoing description of one or more implementations provides illustration and description, but is not intended to be exhaustive or to limit the scope of the embodiments to the precise form disclosed. The modifications and variations are possible in the light of the teachings mentioned above or can be acquired from the practice of various embodiments.
[0123] [0123] Example 81 may include an apparatus comprising means for carrying out one or more elements of a method described in or related to any of Examples 1-80, or any other method or process described herein.
[0124] [0124] Example 82 may include one or more non-transitory, computer-readable means comprising instructions to cause an electronic device, upon execution of instructions by one or more processors of the electronic device, to perform one or more elements of a method described in or related to any of Examples 1-80, or any other method or process described herein.
[0125] [0125] Example 83 may include an apparatus comprising logic, modules, or circuitry for carrying out one or more elements of a method described in, or related to, any of Examples 1-80, or any other method or process described herein.
[0126] [0126] Example 84 may include a method, technique or process as described in, or related to, any of Examples 1-80, or portions or parts thereof.
[0127] [0127] Example 85 may include an apparatus comprising: one or more processors and one or more computer-readable media comprising instructions which, when executed by one or more processors, cause one or more processors to perform the method, techniques, or process, as described in, or related to, any of Examples 1-80, or portions thereof.
[0128] [0128] Example 86 can include a vehicle control system, a vehicle implement control system, or a terrain mapping system adapted to perform a method, technique or process as described in, or related to, any one of examples 1-80, or portions or parts thereof.
[0129] [0129] Some of the "operations" described above can be implemented in software and other operations can be implemented in hardware. One or more of the operations, processes, or methods described herein can be performed by an apparatus, device, or system similar to those described herein with reference to the illustrated figures.
[0130] [0130] "Computer-readable storage media" (or, alternatively, "machine-readable storage media") used in the control system 100 may include any type of memory, as well as new technologies that may arise in the future, as long as they are capable of storing digital information in the nature of a computer program or other data, at least temporarily, such that the stored information can be "read" by an appropriate processing device. The term "computer-readable" should not be limited to the historical use of "computer" to imply a complete mainframe, minicomputer, desktop, wireless device, or even a laptop. Rather, "computer-readable" can comprise a storage medium that can be read by a processor, processing device, or any computing system. Such media can be any available media that can be accessed locally and/or remotely by a computer or processor, and may include volatile and non-volatile media and removable and non-removable media.
[0131] [0131] Examples of systems, apparatus, computer-readable storage media, and methods are provided solely to add context and aid in understanding the disclosed implementations. Thus, it will be evident to a person skilled in the art that the disclosed implementations can be practiced without some or all of the specific details provided. In other instances, certain processes or methods, also referred to herein as "blocks", have not been described in detail to avoid unnecessarily obscuring the disclosed implementations. Other implementations and applications are also possible and, as such, the following examples should not be taken as definitive or limiting in scope or configuration.
[0132] [0132] Reference has been made to the attached drawings, which form part of the description and in which specific implementations are shown by way of illustration. Although these disclosed implementations are described in sufficient detail to allow a person skilled in the art to practice them, it should be understood that these examples are not limiting, so that other implementations can be used and changes can be made to the disclosed implementations without departing from their spirit and scope. For example, in some other implementations the blocks of the methods shown and described are not necessarily executed in the order indicated. In addition, in other implementations, the disclosed methods may include more or fewer blocks than those described. As another example, some blocks described here as separate blocks can be combined in some other implementations. Conversely, what is described here as a single block can be implemented as multiple blocks in some other implementations. In addition, the conjunction "or" is intended here in an inclusive sense, where appropriate, unless otherwise indicated; that is, the phrase "A, B or C" includes the possibilities of "A", "B", "C", "A and B", "B and C", "A and C" and "A, B and C".
[0133] [0133] Having described and illustrated the principles of a preferred embodiment, it should be apparent that the embodiments can be modified in arrangement and detail without departing from those principles. All modifications and variations coming within the spirit and scope of the following claims are claimed.
Claims:
Claims (20)
[1]
1. Terrain mapping system for a vehicle characterized by the fact that it comprises: a processor; a sensor system coupled to the processor to collect three-dimensional terrain data; a digital camera coupled to the processor to capture image data of the terrain; a positioning system coupled to the processor to determine location data for the vehicle; and memory coupled to the processor storing instructions that, when executed by the processor, cause the terrain mapping system to perform operations comprising: identifying, based on the data received from the sensor system, digital camera and positioning system, a topography of the soil surface for a section of the terrain; identifying, based on the data received from the sensor system, digital camera and positioning system, a topography of the vegetation in the section of the terrain; and generating a two-dimensional representation of a three-dimensional terrain map, the three-dimensional terrain map including the topography of the soil surface for the terrain section and the vegetation topography in the terrain section.
[2]
2. Terrain mapping system, according to claim 1, characterized by the fact that the generation of the three-dimensional terrain map includes identifying a plurality of objects within the terrain section in the terrain map, and presenting in the map a respective visual indicator for each respective object that represents whether the object is traversable by the vehicle.
[3]
3. Terrain mapping system, according to claim 1, characterized by the fact that it further comprises: a display screen coupled to the processor, wherein the memory additionally stores instructions to cause the terrain mapping system to display the three-dimensional terrain map on the display screen.
[4]
4. Terrain mapping system, according to claim 1, characterized by the fact that the memory additionally stores instructions for transmitting an electronic communication comprising the three-dimensional terrain map to a vehicle control system.
[5]
5. Terrain mapping system, according to claim 1, characterized by the fact that the positioning system comprises a global navigation satellite system (GNSS) or a local positioning system (LPS).
[6]
6. Terrain mapping system, according to claim 1, characterized by the fact that the sensor system includes one or more of: a radar sensor, a lidar sensor and an imaging device.
[7]
7. Terrain mapping system, according to claim 1, characterized by the fact that the generation of the three-dimensional terrain map includes identifying a height of a portion of the vehicle above the soil surface.
[8]
8. Terrain mapping system, according to claim 7, characterized by the fact that the generation of the three-dimensional terrain map includes determining a depth of tracks made by the vehicle based on a change in the height of the vehicle portion above the soil surface.
[9]
9. Terrain mapping system, according to claim 1, characterized by the fact that the generation of the three-dimensional terrain map includes determining a height of a portion of the vegetation in the terrain above the soil surface.
[10]
10. Terrain mapping system, according to claim 1, characterized by the fact that the generation of the three-dimensional terrain map includes modifying a pre-existing characteristic of a pre-existing three-dimensional terrain map based on the topography of the soil surface for the terrain section and the topography of the vegetation in the terrain section.
[11]
11. Terrain mapping system, according to claim 1, characterized by the fact that the three-dimensional terrain map is further generated based on data from a sensor system coupled to a second vehicle.
[12]
12. Terrain mapping system, according to claim 1, characterized by the fact that the generation of the three-dimensional terrain map includes identifying a moisture level in the terrain section.
[13]
13. Terrain mapping system, according to claim 12, characterized by the fact that identifying the moisture level in the terrain section includes identifying a first moisture level in a first portion of the terrain section and identifying a second moisture level in a second portion of the terrain section, and wherein the first moisture level is different from the second moisture level.
[14]
14. Terrain mapping system, according to claim 12, characterized by the fact that identifying the moisture level in the terrain section includes identifying a body of water in the terrain section.
[15]
15. Terrain mapping system, according to claim 14, characterized by the fact that identifying the moisture level in the terrain section includes determining whether the vehicle is capable of crossing the body of water.
[16]
16. Terrain mapping system, according to claim 14, characterized by the fact that identifying the moisture level in the terrain section includes determining a rate of water flow through the body of water.
[17]
17. Terrain mapping system, according to claim 14, characterized by the fact that identifying the moisture level in the terrain section includes determining a depth of the body of water.
[18]
18. Terrain mapping system, according to claim 14, characterized by the fact that the generation of the three-dimensional terrain map includes identifying a path for the vehicle to bypass the body of water.
[19]
19. Non-transitory, tangible computer-readable medium storing instructions characterized by the fact that, when executed by a terrain mapping system, they cause the terrain mapping system to perform operations comprising: identifying, based on data received from a sensor system, a digital camera and a positioning system, a topography of the soil surface for a section of the terrain; identifying, based on the data received from the sensor system, digital camera and positioning system, a topography of the vegetation in the section of the terrain; and generating a two-dimensional representation of a three-dimensional terrain map, the three-dimensional terrain map including the topography of the soil surface for the terrain section and the vegetation topography in the terrain section.
[20]
20. Method characterized by the fact that it comprises:
identifying, by a terrain mapping system, based on data received from a sensor system, a digital camera and a positioning system, a topography of the soil surface for a section of the terrain;
identifying, by the terrain mapping system, based on the data received from the sensor system, digital camera and positioning system, a topography of the vegetation in the section of the terrain; and generating, by the terrain mapping system, a two-dimensional representation of a three-dimensional terrain map, the three-dimensional terrain map including the topography of the soil surface for the terrain section and the vegetation topography in the terrain section.
Similar technologies:
Publication number | Publication date | Patent title
BR112020008778A2|2020-10-20|terrain mapping system, computer readable medium and method
JP6634484B2|2020-01-22|Agricultural cultivation system and operation method of agricultural drone
US20210112704A1|2021-04-22|Real-time field mapping for autonomous agricultural platform
EP2744670B1|2020-12-16|Vehicle soil pressure management based on topography
EP2744322B1|2019-11-27|Soil compaction management and reporting
BRPI1106300A2|2014-01-28|METHOD AND SYSTEM FOR MAINTAINING VISION LINE COMMUNICATION AMONG A PLURALITY OF MACHINES AND, COMPUTER PROGRAM PRODUCT
US9945832B2|2018-04-17|Apparatus and method to determine ground properties by traction anchors and sensors
US20130046446A1|2013-02-21|Vehicle stability and traction through v-foot shape change
KR20200096503A|2020-08-12|Slip determination system, travel path generation system, and pavement vehicle
US20130054078A1|2013-02-28|Dynamic traction adjustment
US20130046418A1|2013-02-21|V-foot tire management at fleet level
US20190236359A1|2019-08-01|Vision-based system for acquiring crop residue data and related calibration methods
DE102011078292A1|2013-01-03|Method for generating trafficability chart of surrounding area of vehicle, involves detecting image of surrounding area of vehicle by image sensor unit
JP2017131199A|2017-08-03|Tree planting system, tree planting method, farming machine, planting machine and management device
US11140813B1|2021-10-12|Moisture and vegetative health mapping
US10729055B2|2020-08-04|System and method for determining swath connections
Williams et al.2011|Capture of plateau runoff by global positioning system–guided seed drill operation
US20210095963A1|2021-04-01|System and method for tile cached visualizations
JP2020201578A|2020-12-17|Farm field management system
JP2021003070A|2021-01-14|Agricultural field work machine, plowed soil depth information calculation system, plowed soil depth information calculation program, and storage medium
JP6956620B2|2021-11-02|Travel route generation system and field work vehicle
BR112019021414A2|2020-05-05|compensation of the orientation work depth
Rodzik et al.2014|The mechanism and stages of development of a field road gully in relation to changes in the surrounding land relief
Rodzik et al.2014|Mechanizm i stadia rozwoju śródpolnego wąwozu drogowego na tle zmian ukształtowania powierzchni w jego otoczeniu
Olsen1919|Terracing in Texas.
Family patents:
Publication number | Publication date
WO2019089853A1|2019-05-09|
US20190129430A1|2019-05-02|
EP3704443A1|2020-09-09|
US20190124819A1|2019-05-02|
US11193781B2|2021-12-07|
US20190128690A1|2019-05-02|
US20190129435A1|2019-05-02|
US10866109B2|2020-12-15|
CA3079244A1|2019-05-09|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title

US1025132A|1911-06-30|1912-05-07|William M Douglas|Automatic gun or rifle.|
US1025567A|1912-01-26|1912-05-07|Albaugh Dover Co|Liner for centrifugal liquid-separators.|
US4495500A|1982-01-26|1985-01-22|Sri International|Topographic data gathering method|
US5663879A|1987-11-20|1997-09-02|North American Philips Corporation|Method and apparatus for smooth control of a vehicle with automatic recovery for interference|
US5438517A|1990-02-05|1995-08-01|Caterpillar Inc.|Vehicle position determination system and method|
US5194851A|1991-02-21|1993-03-16|Case Corporation|Steering controller|
FI942218A0|1994-05-13|1994-05-13|Modulaire Oy|Automatic storage system Foer obemannat fordon|
US6070673A|1996-11-22|2000-06-06|Case Corporation|Location based tractor control|
CA2283904C|1997-03-21|2007-01-09|The Board Of Trustees Of The Leland Stanford Junior University|A system using leo satellites for centimeter-level navigation|
US6052647A|1997-06-20|2000-04-18|Stanford University|Method and system for automatic control of vehicles based on carrier phase differential GPS|
US6212453B1|1998-09-11|2001-04-03|Honda Giken Kogyo Kabushiki Kaisha|Vehicle steering control system|
AUPP679598A0|1998-10-27|1998-11-19|Agsystems Pty Ltd|A vehicle navigation apparatus|
US6445983B1|2000-07-07|2002-09-03|Case Corporation|Sensor-fusion navigator for automated guidance of off-road vehicles|
US6377889B1|2000-10-13|2002-04-23|Trimble Navigation Limited|Non-linear method of guiding to arbitrary curves with adaptive feedback|
US6711501B2|2000-12-08|2004-03-23|Satloc, Llc|Vehicle navigation system and method for swathing applications|
US6539303B2|2000-12-08|2003-03-25|Mcclure John A.|GPS derived swathing guidance system|
US6819780B2|2001-02-02|2004-11-16|Cnh America Llc|Method and apparatus for automatically steering a vehicle in an agricultural field using a plurality of fuzzy logic membership functions|
AUPR430301A0|2001-04-09|2001-05-17|Agsystems Pty Ltd|Soil-cultivation implement control apparatus and method|
AUPR733701A0|2001-08-29|2001-09-20|Beeline Technologies|Apparatus and method for assisted navigation of a land vehicle|
US6865465B2|2002-05-06|2005-03-08|Csi Wireless, Inc.|Method and system for implement steering for agricultural vehicles|
US7885745B2|2002-12-11|2011-02-08|Hemisphere Gps Llc|GNSS control system and method|
US7162348B2|2002-12-11|2007-01-09|Hemisphere Gps Llc|Articulated equipment position control system and method|
US8583315B2|2004-03-19|2013-11-12|Agjunction Llc|Multi-antenna GNSS control system and method|
US8634993B2|2003-03-20|2014-01-21|Agjunction Llc|GNSS based control for dispensing material from vehicle|
US8594879B2|2003-03-20|2013-11-26|Agjunction Llc|GNSS guidance and machine control|
US7142956B2|2004-03-19|2006-11-28|Hemisphere Gps Llc|Automatic steering system and method|
US8190337B2|2003-03-20|2012-05-29|Hemisphere GPS, LLC|Satellite based vehicle guidance control in straight and contour modes|
US7437230B2|2003-03-20|2008-10-14|Hemisphere Gps Llc|Satellite based vehicle guidance control in straight and contour modes|
US8214111B2|2005-07-19|2012-07-03|Hemisphere Gps Llc|Adaptive machine control system and method|
US7400956B1|2003-03-20|2008-07-15|Hemisphere Gps Inc.|Satellite position and heading sensor for vehicle steering control|
US7689354B2|2003-03-20|2010-03-30|Hemisphere Gps Llc|Adaptive guidance system and method|
US9002565B2|2003-03-20|2015-04-07|Agjunction Llc|GNSS and optical guidance and machine control|
US8639416B2|2003-03-20|2014-01-28|Agjunction Llc|GNSS guidance and machine control|
US6789014B1|2003-05-09|2004-09-07|Deere & Company|Direct modification of DGPS information with inertial measurement data|
US7272474B1|2004-03-31|2007-09-18|Carnegie Mellon University|Method and system for estimating navigability of terrain|
US20060167600A1|2005-01-27|2006-07-27|Raven Industries, Inc.|Architecturally partitioned automatic steering system and method|
US7865285B2|2006-12-27|2011-01-04|Caterpillar Inc|Machine control system and method|
US8768558B2|2007-01-05|2014-07-01|Agjunction Llc|Optical tracking vehicle control system and method|
US7835832B2|2007-01-05|2010-11-16|Hemisphere Gps Llc|Vehicle control system|
WO2009100463A1|2008-02-10|2009-08-13|Hemisphere Gps Llc|Visual, gnss and gyro autosteering control|
US8018376B2|2008-04-08|2011-09-13|Hemisphere Gps Llc|GNSS-based mobile communication system and method|
US8386129B2|2009-01-17|2013-02-26|Hemipshere GPS, LLC|Raster-based contour swathing for guidance and variable-rate chemical application|
US8126620B2|2009-04-28|2012-02-28|Cnh America Llc|Grain transfer control system and method|
US8311696B2|2009-07-17|2012-11-13|Hemisphere Gps Llc|Optical tracking vehicle control system and method|
US8401704B2|2009-07-22|2013-03-19|Hemisphere GPS, LLC|GNSS control system and method for irrigation and related applications|
US20120139755A1|2009-08-11|2012-06-07|On Time Systems, Inc.|Automatic Detection of Road Conditions|
US8306672B2|2009-09-09|2012-11-06|GM Global Technology Operations LLC|Vehicular terrain detection system and method|
US8649930B2|2009-09-17|2014-02-11|Agjunction Llc|GNSS integrated multi-sensor control system and method|
US8548649B2|2009-10-19|2013-10-01|Agjunction Llc|GNSS optimized aircraft control system and method|
US9173337B2|2009-10-19|2015-11-03|Efc Systems, Inc.|GNSS optimized control system and method|
EP2504663A1|2009-11-24|2012-10-03|Telogis, Inc.|Vehicle route selection based on energy usage|
US8583326B2|2010-02-09|2013-11-12|Agjunction Llc|GNSS contour guidance path selection|
WO2011150351A2|2010-05-28|2011-12-01|Gvm, Inc.|System and method for collecting and processing agricultural field data|
US8489291B2|2010-07-12|2013-07-16|Hemisphere Gps Llc|System and method for collecting soil samples|
US8897483B2|2010-11-09|2014-11-25|Intelescope Solutions Ltd.|System and method for inventorying vegetal substance|
US8803735B2|2010-11-19|2014-08-12|Agjunction Llc|Portable base station network for local differential GNSS corrections|
US9127948B2|2011-03-29|2015-09-08|Raytheon Company|Path determination using elevation data|
DK2527648T3|2011-05-25|2019-02-18|Siemens Ag|Method for determining a wind turbine location|
US8843269B2|2011-08-17|2014-09-23|Deere & Company|Vehicle soil pressure management based on topography|
US8589013B2|2011-10-25|2013-11-19|Jaybridge Robotics, Inc.|Method and system for dynamically positioning a vehicle relative to another vehicle in motion|
US8781685B2|2012-07-17|2014-07-15|Agjunction Llc|System and method for integrating automatic electrical steering with GNSS guidance|
US9117185B2|2012-09-19|2015-08-25|The Boeing Company|Forestry management system|
US10131376B2|2013-01-30|2018-11-20|Agjunction Llc|Steering controller for precision farming|
US9162703B2|2013-01-30|2015-10-20|AgJunction, LLC|Steering controller for precision farming|
US9781915B2|2013-03-14|2017-10-10|Agjunction Llc|Implement and boom height control system and method|
US9945957B2|2013-03-14|2018-04-17|Agjunction Llc|Machine control system and method|
US9223314B2|2013-03-14|2015-12-29|Aglunction, LLC|Hovering control for helicopters using a GNSS vector|
US20140266877A1|2013-03-15|2014-09-18|Agjunction Llc|Precision accuracy global navigation satellite system with smart devices|
US9733643B2|2013-12-20|2017-08-15|Agjunction Llc|Hydraulic interrupter safety system and method|
DE102014208068A1|2014-04-29|2015-10-29|Deere & Company|Harvester with sensor-based adjustment of a working parameter|
US9420737B2|2014-08-27|2016-08-23|Trimble Navigation Limited|Three-dimensional elevation modeling for use in operating agricultural vehicles|
US9454153B2|2014-11-24|2016-09-27|Trimble Navigation Limited|Farm vehicle autopilot with automatic calibration, tuning and diagnostics|
US9857478B2|2015-01-27|2018-01-02|Agjunction Llc|Apparatus and method to mount steering actuator|
WO2016142858A1|2015-03-11|2016-09-15|Trailze, Ltd.|Automatically creating a terrain mapping database|
US20210148708A1|2015-03-11|2021-05-20|Trailze Ltd|Automatically creating a terrain mapping database|
US9715016B2|2015-03-11|2017-07-25|The Boeing Company|Real time multi dimensional image fusing|
BR102016008666A2|2015-05-12|2016-11-16|Autonomous Solutions Inc|base station control system, method for controlling an agricultural vehicle and standalone agricultural system|
US9745060B2|2015-07-17|2017-08-29|Topcon Positioning Systems, Inc.|Agricultural crop analysis drone|
US20170127606A1|2015-11-10|2017-05-11|Digi-Star, Llc|Agricultural Drone for Use in Controlling the Direction of Tillage and Applying Matter to a Field|
US9703290B1|2016-02-13|2017-07-11|Caterpillar Inc.|Method for operating machines on worksites|
CN113238581A|2016-02-29|2021-08-10|星克跃尔株式会社|Method and system for flight control of unmanned aerial vehicle|
US10206324B2|2016-03-11|2019-02-19|Steven R. Gerrish|Autonomous agricultural robot for decision making and courses of action considering real-time conditions|
US10251329B2|2016-06-10|2019-04-09|Cnh Industrial Canada, Ltd.|Planning and control of autonomous agricultural operations|
US10040476B2|2016-11-02|2018-08-07|Caterpillar Inc.|Autonomous steering system for an articulated truck|
US10255670B1|2017-01-08|2019-04-09|Dolly Y. Wu PLLC|Image sensor and module for agricultural crop improvement|
US10761544B2|2017-10-13|2020-09-01|Deere & Company|Unmanned aerial vehicle -assisted worksite operations|
US20190114847A1|2017-10-13|2019-04-18|Deere & Company|Unmanned aerial vehicle -assisted worksite data acquisition|
EP3704443A1|2017-10-31|2020-09-09|Agjunction LLC|Three-dimensional terrain mapping|
DE102015118767A1|2015-11-03|2017-05-04|Claas Selbstfahrende Erntemaschinen Gmbh|Environment detection device for agricultural machine|
EP3704443A1|2017-10-31|2020-09-09|Agjunction LLC|Three-dimensional terrain mapping|
US10687466B2|2018-01-29|2020-06-23|Cnh Industrial America Llc|Predictive header height control system|
US20190279512A1|2018-03-12|2019-09-12|Ford Global Technologies, Llc.|Vehicle cameras for monitoring off-road terrain|
WO2019195954A1|2018-04-08|2019-10-17|大连理工大学|Landform observation apparatus based on mechanical fine adjustment of dual-host|
US10883256B2|2018-05-25|2021-01-05|Deere & Company|Object responsive control system for a work machine|
US11240961B2|2018-10-26|2022-02-08|Deere & Company|Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity|
US11178818B2|2018-10-26|2021-11-23|Deere & Company|Harvesting machine control system with fill level processing based on yield data|
US10955841B2|2018-12-28|2021-03-23|At&T Intellectual Property I, L.P.|Autonomous vehicle sensor security system|
US11079725B2|2019-04-10|2021-08-03|Deere & Company|Machine control using real-time model|
US11234366B2|2019-04-10|2022-02-01|Deere & Company|Image selection for machine control|
US10849264B1|2019-05-21|2020-12-01|Farmobile Llc|Determining activity swath from machine-collected worked data|
CN110539794B|2019-09-24|2021-07-20|湖北航天技术研究院特种车辆技术中心|Vehicle steering control method and system|
US20210089027A1|2019-09-25|2021-03-25|Cnh Industrial America Llc|System and method for providing a visual indicator of field surface profile|
WO2021089813A2|2019-11-08|2021-05-14|Kverneland Group Operations Norway As|System for measuring and interpreting a force|
EP3818802A1|2019-11-08|2021-05-12|Kverneland Group Operations Norway AS|System for measuring and interpreting a force|
WO2021136234A1|2020-01-03|2021-07-08|苏州宝时得电动工具有限公司|Self-moving device and automatic moving and working method therefor, and storage medium|
EP3854615A1|2020-01-21|2021-07-28|CNH Industrial Italia S.p.A.|Agricultural vehicle with an active suspension system based on soil surface 3d mapping|
DE102020102330A1|2020-01-30|2021-08-05|CLAAS Tractor S.A.S|tractor|
US20210241390A1|2020-01-31|2021-08-05|Deere & Company|Systems and methods for site traversability sensing|
DE102020109936A1|2020-04-09|2021-10-14|Claas E-Systems Gmbh|Steering system for a commercial vehicle|
DE102020111958A1|2020-05-04|2021-11-04|Amazonen-Werke H. Dreyer SE & Co. KG|Method for determining a movement path for an agricultural machine|
DE102020117477A1|2020-07-02|2022-01-05|Claas E-Systems Gmbh|System for determining the position of a camera of a camera arrangement with respect to a ground plane|
Legal status:
2021-11-23| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Filing date | Patent title
US201762579515P| true| 2017-10-31|2017-10-31|
US62/579,515|2017-10-31|
PCT/US2018/058586|WO2019089853A1|2017-10-31|2018-10-31|Three-dimensional terrain mapping|